Low Power, Low Delay: Opportunistic Routing meets Duty Cycling
Traditionally, routing in wireless sensor networks consists of
two steps: First, the routing protocol selects a next hop,
and, second, the MAC protocol waits for the intended destination
to wake up and receive the data. This design makes
it difficult to adapt to link dynamics and introduces delays
while waiting for the next hop to wake up.
In this paper we introduce ORW, a practical opportunistic
routing scheme for wireless sensor networks. In a duty-cycled
setting, packets are addressed to sets of potential receivers
and forwarded by the neighbor that wakes up first
and successfully receives the packet. This reduces delay and
energy consumption by utilizing all neighbors as potential
forwarders. Furthermore, this increases resilience to wireless
link dynamics by exploiting spatial diversity. Our results
show that ORW reduces radio duty-cycles on average
by 50% (up to 90% on individual nodes) and delays by 30%
to 90% when compared to the state of the art.
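The delay benefit of addressing a set of potential forwarders, rather than one fixed next hop, can be illustrated with a toy simulation (hypothetical parameters and a simplified model, not the ORW implementation):

```python
import random

CYCLE = 1.0  # wake-up interval of the duty cycle (arbitrary time units)

def unicast_delay(rng, p=0.8):
    """Traditional design: wait for the single chosen next hop to wake up;
    on a failed reception, wait a full cycle and retry."""
    delay = rng.uniform(0, CYCLE)   # time until the fixed next hop wakes
    while rng.random() >= p:        # lossy link: retry on the next wake-up
        delay += CYCLE
    return delay

def opportunistic_delay(rng, n_neighbors=4, p=0.8):
    """ORW-style forwarding: the packet is addressed to a set of potential
    receivers; whichever neighbor wakes up first and successfully receives
    the packet forwards it."""
    delay = 0.0
    while True:
        wakeups = sorted(rng.uniform(0, CYCLE) for _ in range(n_neighbors))
        for t in wakeups:
            if rng.random() < p:    # first successful reception forwards
                return delay + t
        delay += CYCLE              # nobody received it; next duty cycle

rng = random.Random(1)
N = 10_000
uni = sum(unicast_delay(rng) for _ in range(N)) / N
opp = sum(opportunistic_delay(rng) for _ in range(N)) / N
print(f"mean one-hop delay, single next hop: {uni:.3f}")
print(f"mean one-hop delay, opportunistic:   {opp:.3f}")
```

With several candidate forwarders, the expected wait collapses toward the earliest wake-up, and a single bad link no longer stalls the packet for whole cycles, which is the intuition behind the reported delay and duty-cycle reductions.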
Intelligent data acquisition for drug design through combinatorial library design
A problem that occurs in machine learning methods for drug discovery is a need for standardized data. Methods and interest exist for producing new data, but due to material and budget constraints it is desirable that each iteration of producing data is as efficient as possible. In this thesis, we present two papers detailing methods for different problems in selecting data to produce. We investigate Active Learning for models that use the margin in model decisiveness to measure model uncertainty and guide data acquisition. We demonstrate that the models perform better with Active Learning than with random acquisition of data, independent of machine learning model and starting knowledge. We also study the multi-objective optimization problem of combinatorial library design. Here we present a framework that can process the output of generative models for molecular design and produce an optimized library design. The results show that the framework successfully optimizes a library based on molecule availability, which the framework also attempts to identify using retrosynthesis prediction. We conclude that the next step in intelligent data acquisition is to combine the two methods and create a library design model that uses the information of previous libraries to guide subsequent designs.
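The margin criterion mentioned above can be sketched in a few lines (a minimal illustration, not the thesis code; the toy class probabilities are made up):

```python
import numpy as np

def margin_scores(proba):
    """Margin = gap between the two most likely classes for each sample;
    a small margin means the model is undecided about that sample."""
    ordered = np.sort(proba, axis=1)
    return ordered[:, -1] - ordered[:, -2]

def select_batch(proba, k):
    """Active learning step: pick the k unlabeled samples with the
    smallest margin and send them off for labeling (data production)."""
    return np.argsort(margin_scores(proba))[:k]

# Toy pool of predicted class probabilities for 5 unlabeled samples.
proba = np.array([
    [0.95, 0.03, 0.02],   # confident
    [0.40, 0.35, 0.25],   # undecided -> small margin
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],   # most undecided
    [0.85, 0.10, 0.05],
])
print(select_batch(proba, 2))  # indices of the two most uncertain samples
```

In an iterated acquisition loop, the selected samples would be synthesized and assayed, the model retrained, and the selection repeated, which is where the margin criterion competes with random acquisition.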
Utilization of Tags in a Knowledge Bank
In this age of information, a person is accustomed to retrieving, on demand and within seconds, all thinkable information between heaven and earth with the help of a technical device. It is therefore important that information is stored and presented in such a way that a person can absorb it with ease. The IT company Nilex, where this research took place, provides such storage in the form of a knowledge bank. The premise of this thesis is to investigate how information from a knowledge bank can be displayed and found using the principles of interaction design. A central focus of the research was whether tagging functionality could have a positive effect for this purpose. This thesis describes the problem identification phase followed by the solution design phase, which also includes testing of the design proposals. The conclusions drawn from the discussion of each method were essential to move the project forward. The testing was performed in two phases, first without tagging functionality and then with it, to try to answer the hypotheses surrounding tags. No tests or implementations in real software were done during the research, but the conclusions from the tests on the design proposals indicate that tags may have a positive effect, which makes them applicable to a real environment.
The process of product development in small agricultural firms
This study provides a better understanding of small-scale farmers' New Product Development (NPD) processes: what activities farmers undertake and how they work with NPD processes. NPD plays a crucial role in creating and maintaining competitiveness in many industries, and its importance has grown over the years due to changing market conditions, also in the agricultural sector. One of the most critical and important tasks for a firm is the launch and development of new products; moreover, the high failure rate in the development of new food products, with 72-88% failing at launch, shows that the knowledge of practices for developing new food products needs to be expanded. There is great potential for Swedish lamb production to conduct NPD processes, with an animal that produces three different base materials, wool, meat and sheepskin, and an increasing demand for newly developed products.
Previous studies have focused on NPD in large firms and mostly on industrial products. The food product sector has been neglected, which has resulted in a paucity of studies on how small firms can incorporate and work with NPD practices. Moreover, previous research has tried to identify success factors and "best practices". In those studies, the firm context is treated as static and over-simplified; this study, however, sheds light on NPD processes from a farmer's perspective, filling a knowledge gap on NPD in the agricultural context. The findings from this study provide empirical insights that might be valuable for fulfilling the government's and policymakers' wish to increase the innovation rate in the agricultural sector of Sweden.
A conceptual framework was developed based on the stage-gate model and a literature review of success factors when firms engage in innovation projects. Using a qualitative approach based on semi-structured interviews, a multiple case study was conducted on eleven lamb producers on the island of Gotland in Sweden. A conclusion is that the interviewed farmers are most engaged in the activities of development, testing and validation, and consumer relationships when launching the product, and less engaged in scoping, building a business case and post-launch review. Moreover, the farmers are more involved in processes within firm- and product-related factors and less involved in activities that relate to project- and market-related factors.
Riemann-Hilbert approach to multi-time processes; the Airy and the Pearcey case
We prove that matrix Fredholm determinants related to multi-time processes
can be expressed in terms of determinants of integrable kernels à la
Its-Izergin-Korepin-Slavnov (IIKS) and hence related to suitable
Riemann-Hilbert problems, thus extending the known results for the single-time
case. We focus on the Airy and Pearcey processes. As an example of applications,
we re-deduce a third-order PDE, found by Adler and van Moerbeke, for the
two-time Airy process. Comment: 18 pages, 1 figure
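For context, a kernel is integrable in the IIKS sense when it has the following form (the standard definition from the general theory, sketched here, not a formula taken from the paper):

```latex
K(\lambda,\mu) \;=\; \frac{\mathbf{f}(\lambda)^{T}\,\mathbf{g}(\mu)}{\lambda-\mu},
\qquad
\mathbf{f}(\lambda)^{T}\,\mathbf{g}(\lambda) = 0,
```

where \(\mathbf{f},\mathbf{g}\) are vector-valued functions on a contour \(\Sigma\). The Fredholm determinant \(\det(1-K)\) on \(\Sigma\) is then governed by a Riemann-Hilbert problem whose jump matrix is \(J(\lambda)=I-2\pi i\,\mathbf{f}(\lambda)\mathbf{g}(\lambda)^{T}\) for \(\lambda\in\Sigma\); casting the multi-time kernels into this shape is what connects them to Riemann-Hilbert analysis.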
Constrained simulations of the Antennae Galaxies: Comparison with Herschel-PACS observations
We present a set of hydro-dynamical numerical simulations of the Antennae
galaxies in order to understand the origin of the central overlap starburst.
Our dynamical model provides a good match to the observed nuclear and overlap
star formation, especially when using a range of rather inefficient stellar
feedback efficiencies (0.01 < q_EoS < 0.1). In this case a simple conversion of
local star formation to molecular hydrogen surface density motivated by
observations accounts well for the observed distribution of CO. Using radiative
transfer post-processing we model synthetic far-infrared spectral energy
distributions (SEDs) and two-dimensional emission maps for direct comparison
with Herschel-PACS observations. For a gas-to-dust ratio of 62:1 and the best
matching range of stellar feedback efficiencies the synthetic far-infrared SEDs
of the central star forming region peak at values of ~65 - 81 Jy at 99 - 116
um, similar to a three-component modified black body fit to infrared
observations. Also the spatial distribution of the far-infrared emission at 70
um, 100 um, and 160 um compares well with the observations: >50% (> 35%) of the
emission in each band is concentrated in the overlap region while only < 30% (<
15%) is distributed to the combined emission from the two galactic nuclei in
the simulations (observations). As a proof of principle we show that parameter
variations in the feedback model result in unambiguous changes both in the
global and in the spatially resolved observable far-infrared properties of
Antennae galaxy models. Our results strengthen the importance of direct,
spatially resolved comparative studies of matched galaxy merger simulations as
a valuable tool to constrain the fundamental star formation and feedback
physics. Comment: 17 pages, 8 figures, 4 tables, submitted to MNRAS, including
revisions after first referee report, comments welcome
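For reference, a common parametrization of an N-component modified black body, of the kind used for such far-infrared SED fits, is (a standard form assumed here, not the paper's exact parametrization):

```latex
S_\nu \;=\; \sum_{i=1}^{3} A_i\,\nu^{\beta} B_\nu(T_i),
\qquad
B_\nu(T) \;=\; \frac{2 h \nu^{3}}{c^{2}}\,\frac{1}{e^{h\nu/k_B T}-1},
```

where each component \(i\) has an amplitude \(A_i\) and dust temperature \(T_i\), and \(\beta\) is the dust emissivity index. Fitting three such components to the observed fluxes yields the peak values and wavelengths against which the synthetic SEDs are compared.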
Cultural evolution leads to vocal iconicity in an experimental iterated learning task
Experimental and cross-linguistic studies have shown that vocal iconicity is prevalent in words that carry meanings related to size and shape. Although these studies demonstrate the importance of vocal iconicity and reveal the cognitive biases underpinning it, there is less work demonstrating how these biases lead to the evolution of a sound symbolic lexicon in the first place. In this study, we show how words can be shaped by cognitive biases through cultural evolution. Using a simple experimental setup resembling the game telephone, we examined how a single word form changed as it was passed from one participant to the next by a process of immediate iterated learning. About 1,500 naĂŻve participants were recruited online and divided into five condition groups. The participants in the control-group received no information about the meaning of the word they were about to hear, while the participants in the remaining four groups were informed that the word meant either big or small (with the meaning being presented in text), or round or pointy (with the meaning being presented as a picture). The first participant in a transmission chain was presented with a phonetically diverse word and asked to repeat it. Thereafter, the recording of the repeated word was played for the next participant in the same chain. The sounds of the audio recordings were then transcribed and categorized according to six binary sound parameters. By modelling the proportion of vowels or consonants for each sound parameter, the small-condition showed increases of front unrounded vowels and the pointy-condition increases of acute consonants. The results show that linguistic transmission is sufficient for vocal iconicity to emerge, which demonstrates the role non-arbitrary associations play in the evolution of language
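The transmission-chain dynamic can be sketched with a toy model (hypothetical bias and noise values, a deliberately simplified binary sound parameter, not the study's analysis):

```python
import random

# A "word" is a list of segments, each in one of two categories of a single
# binary sound parameter (e.g. front vs. back vowel). Each participant
# repeats the word: most segments are reproduced faithfully, but mishearings
# are resampled with a meaning-dependent bias (e.g. "small" -> front vowels).

def transmit(word, bias, rng):
    """One participant hears the word and repeats it."""
    out = []
    for seg in word:
        if rng.random() < 0.8:                     # faithful reproduction
            out.append(seg)
        else:                                      # biased mishearing
            out.append("front" if rng.random() < bias else "back")
    return out

def run_chain(length, generations, bias, rng):
    """Pass one random seed word down a chain; return the final
    proportion of 'front' segments."""
    word = [rng.choice(["front", "back"]) for _ in range(length)]
    for _ in range(generations):
        word = transmit(word, bias, rng)
    return word.count("front") / length

rng = random.Random(0)
control = sum(run_chain(8, 15, 0.5, rng) for _ in range(300)) / 300
small = sum(run_chain(8, 15, 0.8, rng) for _ in range(300)) / 300
print(f"proportion of front vowels, control: {control:.2f}")
print(f"proportion of front vowels, 'small': {small:.2f}")
```

Even a weak per-generation bias accumulates over the chain, pulling the word toward the iconic form while the unbiased control chains drift around chance, which mirrors how repeated transmission alone can give rise to sound symbolism.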
Tensile strain-induced softening of iron at high temperature
In weakly ferromagnetic materials, already small changes in the atomic
configuration triggered by temperature or chemistry can alter the magnetic
interactions responsible for the non-random atomic-spin orientation. Different
magnetic states, in turn, can give rise to substantially different macroscopic
properties. A classical example is iron, which exhibits a great variety of
properties as one gradually removes the magnetic long-range order by raising
the temperature towards and beyond its Curie point of about 1043 K.
Using first-principles theory, here we demonstrate
that uniaxial tensile strain can also destabilize the magnetic order in iron
and eventually lead to a ferromagnetic to paramagnetic transition at
temperatures far below the Curie point. In consequence, the intrinsic
strength of the ideal single-crystal body-centered cubic iron dramatically
weakens above a critical temperature. The discovered
strain-induced magneto-mechanical softening provides a plausible atomic-level
mechanism behind the observed drop of the measured strength of Fe whiskers
at elevated temperatures. Alloying additions which have the capability to partially
restore the magnetic order in the strained Fe lattice, push the critical
temperature for the strength-softening scenario towards the magnetic transition
temperature of the undeformed lattice. This can result in a surprisingly large
alloying-driven strengthening effect at high temperature as illustrated here in
the case of an Fe-Co alloy. Comment: 3 figures
Risk Assessment Model Applied on Building Physics: Statistical Data Acquisition and Stochastic Modeling of Indoor Moisture Supply in Swedish Multi-family Dwellings
Though it is highly appreciated and asked for by practitioners, there is a lack of tools to perform proper risk assessment and risk management procedures in the area of building physics. Many of the influential variables, such as outdoor temperature and indoor moisture supply, have stochastic variations, so a general approach to risk assessment is complicated. The aim of this study is to define risk concepts in building physics and develop a risk assessment model to be used in the field. The study is based on hazard identification tools used in the process industry, such as What-if, HAZOP, FMEA and VMEA. The tools are compared and used in the modeling process, which leads to the identification of noise factors during design, construction and service life. A literature survey is conducted in order to find statistical input data to be used in the applicability study, based on stochastic simulations and airflow path modeling in CONTAM. Combining the hazards and safeguards in a scenario with Monte Carlo simulations gives results with a distribution that depends on the variability of the noise factors. The applicability study shows good correspondence with measurements of the indoor moisture supply performed in Swedish multi-family dwellings. Risk and safe scenarios are defined by comparing the result of the scenario with an allowed level of consequences. By implementing risk management into building physics design, it is possible to identify critical points and avoid unwanted extra costs. In addition, risks concerning indoor climate, health and durability are clarified.
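The Monte Carlo step can be sketched as follows (the distributions, units and allowed level are illustrative placeholders, not the statistical data gathered in the study):

```python
import random

def moisture_supply_sample(rng):
    """One stochastic scenario: draw the noise factors and compute the
    resulting indoor moisture supply. Hypothetical distributions."""
    production = rng.gauss(300.0, 100.0)        # moisture production, g/h
    airflow = max(rng.gauss(30.0, 8.0), 5.0)    # outdoor airflow, L/s
    return production / (airflow * 3.6)         # g/m3 (L/s -> m3/h)

def exceedance_probability(limit, n, seed=0):
    """Monte Carlo estimate of the probability that the indoor moisture
    supply exceeds the allowed level, i.e. the 'risk scenario'."""
    rng = random.Random(seed)
    hits = sum(moisture_supply_sample(rng) > limit for _ in range(n))
    return hits / n

p = exceedance_probability(limit=4.0, n=100_000)
print(f"P(moisture supply > 4.0 g/m3) = {p:.3f}")
```

The output distribution, rather than a single deterministic value, is what allows a scenario to be classified as risk or safe against an allowed level of consequences.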